Inference for Eigenvalues and Eigenvectors of Gaussian Symmetric Matrices

Authors

  • ARMIN SCHWARTZMAN
  • WALTER F. MASCARENHAS
  • JONATHAN E. TAYLOR
Abstract

This article presents maximum likelihood estimators (MLEs) and log-likelihood ratio (LLR) tests for the eigenvalues and eigenvectors of Gaussian random symmetric matrices of arbitrary dimension, where the observations are independent repeated samples from one or two populations. These inference problems are relevant in the analysis of diffusion tensor imaging data and polarized cosmic background radiation data, where the observations are, respectively, 3 × 3 and 2 × 2 symmetric positive definite matrices. The parameter sets involved in the inference problems for eigenvalues and eigenvectors are subsets of Euclidean space that are either affine subspaces, embedded submanifolds that are invariant under orthogonal transformations, or polyhedral convex cones. We show that for a class of sets that includes the ones considered in this paper, the MLEs of the mean parameter do not depend on the covariance parameters if and only if the covariance structure is orthogonally invariant. Closed-form expressions for the MLEs and the associated LLRs are derived for this covariance structure.
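As a minimal numerical sketch of the one-sample setting (illustrative only: the function names and the 3 × 3 setup below are hypothetical, and the paper's full model distinguishes diagonal from off-diagonal variances), the abstract's result implies that under an orthogonally invariant covariance the MLE of an unrestricted mean matrix is the sample mean, so its eigendecomposition gives plug-in estimates of the population eigenvalues and eigenvectors:

```python
import numpy as np

def symmetrize(a):
    """Project a square matrix onto the symmetric matrices."""
    return 0.5 * (a + a.T)

def mle_eigen(samples):
    """Plug-in estimates of the eigenvalues/eigenvectors of the mean.

    Under an orthogonally invariant covariance the MLE of the mean
    matrix does not depend on the covariance parameters; with an
    unrestricted mean it is the sample mean, and its eigendecomposition
    gives plug-in estimates of the population eigenstructure.
    """
    m_hat = symmetrize(samples.mean(axis=0))   # sample mean matrix
    eigvals, eigvecs = np.linalg.eigh(m_hat)   # ascending eigenvalues
    return eigvals[::-1], eigvecs[:, ::-1]     # descending order

# Simulate n noisy 3x3 symmetric matrices around a known mean M.
rng = np.random.default_rng(0)
M = np.diag([3.0, 2.0, 1.0])
n, sigma = 200, 0.1
samples = np.array([symmetrize(M + sigma * rng.standard_normal((3, 3)))
                    for _ in range(n)])
vals, vecs = mle_eigen(samples)
print(np.round(vals, 2))   # close to [3. 2. 1.]
```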


Similar Articles


Group Comparison of Eigenvalues and Eigenvectors of Diffusion Tensors

Diffusion tensor imaging (DTI) data differ from most medical images in that values at each voxel are not scalars, but 3×3 symmetric positive definite matrices called diffusion tensors (DTs). The anatomic characteristics of the tissue at each voxel are reflected by the DT eigenvalues and eigenvectors. In this article we consider the problem of testing whether the means of two groups of DT images...
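As a toy sketch of such a two-group comparison (not the test developed in that article: it assumes a known variance and treats all six free tensor entries as i.i.d. Gaussian, ignoring the diagonal/off-diagonal weighting a proper orthogonally invariant model would use):

```python
import numpy as np
from scipy.stats import chi2

def vech(s):
    """Stack the 6 free entries of a 3x3 symmetric matrix."""
    i, j = np.triu_indices(3)
    return s[i, j]

def two_group_stat(group1, group2, sigma2):
    """Chi-square statistic for equal mean tensors under the toy model.

    Treats the vech'd entries of each tensor as independent
    N(mu, sigma2) variables; under equal means the statistic is
    chi-square with 6 degrees of freedom.
    """
    x = np.array([vech(s) for s in group1])
    y = np.array([vech(s) for s in group2])
    n1, n2 = len(x), len(y)
    diff = x.mean(axis=0) - y.mean(axis=0)
    stat = (n1 * n2 / (n1 + n2)) * (diff @ diff) / sigma2
    return stat, chi2.sf(stat, df=diff.size)   # df = 6
```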


II.G Gaussian Integrals

It can be reduced to a product of N one-dimensional integrals by diagonalizing the matrix K ≡ K_{ij}. Since we need only consider symmetric matrices (K_{ij} = K_{ji}), the eigenvalues are real, and the eigenvectors can be made orthonormal. Let us denote the eigenvectors and eigenvalues of K by q̂ and K_q respectively, i.e. K q̂ = K_q q̂. The vectors {q̂} form a new coordinate basis in the original N dimensi...
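A quick numerical check of this factorization, assuming a small 2 × 2 positive definite K (the grid size and bounds below are arbitrary choices):

```python
import numpy as np

# Check numerically that diagonalizing a symmetric positive definite K
# factorizes the N-dimensional Gaussian integral over its eigenvalues:
#   integral exp(-x.T K x / 2) dx = prod_q sqrt(2*pi / K_q)
K = np.array([[2.0, 0.5],
              [0.5, 1.0]])                    # N = 2 example

eigvals = np.linalg.eigvalsh(K)               # real eigenvalues K_q
analytic = np.prod(np.sqrt(2 * np.pi / eigvals))

# Brute-force Riemann sum on a grid; the integrand decays quickly.
t = np.linspace(-8, 8, 801)
X, Y = np.meshgrid(t, t)
integrand = np.exp(-0.5 * (K[0, 0] * X**2 + 2 * K[0, 1] * X * Y
                           + K[1, 1] * Y**2))
numeric = integrand.sum() * (t[1] - t[0]) ** 2

print(analytic, numeric)   # agree to several decimal places
```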



Pseudo-Gaussian Inference in Heterokurtic Elliptical Common Principal Components Models

The so-called Common Principal Components (CPC) model, in which the covariance matrices Σ_i of m populations are assumed to have identical eigenvectors, was introduced by Flury (1984), who developed Gaussian parametric inference methods for this model (Gaussian maximum likelihood estimation and Gaussian likelihood ratio testing). A key result in that context is the joint asymptotic normality of t...
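A small simulation of the CPC structure (illustrative only: recovering the common eigenvectors from the averaged covariance works when the model holds exactly, and is not Flury's actual FG estimation algorithm):

```python
import numpy as np

rng = np.random.default_rng(2)
p, m = 3, 4

# A common orthogonal eigenvector matrix B shared by all m populations.
B, _ = np.linalg.qr(rng.standard_normal((p, p)))

# CPC structure: Sigma_i = B diag(lambda_i) B.T with population-specific
# eigenvalues lambda_i but identical eigenvectors B.
lams = rng.uniform(0.5, 5.0, size=(m, p))
sigmas = [B @ np.diag(l) @ B.T for l in lams]

# Naive check: when the CPC model holds exactly, the eigenvectors of
# the averaged covariance recover B up to sign and column ordering.
B_hat = np.linalg.eigh(np.mean(sigmas, axis=0))[1]
print(np.round(np.abs(B_hat.T @ B), 3))   # a permutation matrix
```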




Publication date: 2008